Elastic exponential linear units for convolutional neural networks

Authors
Abstract


Related articles

Flexible Rectified Linear Units for Improving Convolutional Neural Networks

The rectified linear unit (ReLU) is a widely used activation function for deep convolutional neural networks. In this paper, we propose a novel activation function called the flexible rectified linear unit (FReLU). FReLU improves the flexibility of ReLU by introducing a learnable rectified point, and achieves faster convergence and higher performance. Furthermore, FReLU does not rely on strict assumptions by s...
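A minimal sketch of the idea in PyTorch, assuming FReLU takes the form relu(x) + b with a learnable bias b (equivalently, a ReLU whose rectified point is learned rather than fixed at zero); the per-channel parameterization is an assumption, and the paper's exact formulation may differ.

```python
import torch
import torch.nn as nn


class FReLU(nn.Module):
    """Flexible ReLU sketch: relu(x) + b with a learnable bias b.

    Equivalent to max(x + b, b), i.e. a ReLU whose rectified point is
    learned. One bias per channel is an assumption.
    """

    def __init__(self, num_channels: int):
        super().__init__()
        self.bias = nn.Parameter(torch.zeros(num_channels))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Broadcast the per-channel bias over (N, C, H, W) inputs.
        b = self.bias.view(1, -1, 1, 1)
        return torch.relu(x) + b
```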


Improving Deep Neural Network with Multiple Parametric Exponential Linear Units

The activation function is crucial to the recent successes of neural networks. In this paper, we propose a new activation function that generalizes and unifies the rectified and exponential linear units. The proposed method, named MPELU, has the advantages of both PReLU and ELU. We show that by introducing learnable parameters as in PReLU, MPELU provides better generalization capability than PReLU and ELU...
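A minimal sketch of the unification, assuming the commonly cited MPELU form: identity for positive inputs and α(exp(βx) − 1) otherwise, with learnable α and β (per-channel parameters are an assumption). Setting α = β = 1 recovers ELU, while small β makes the negative branch approximately linear, PReLU-like.

```python
import torch
import torch.nn as nn


class MPELU(nn.Module):
    """MPELU sketch: x for x > 0, alpha * (exp(beta * x) - 1) otherwise."""

    def __init__(self, num_channels: int):
        super().__init__()
        self.alpha = nn.Parameter(torch.ones(num_channels))
        self.beta = nn.Parameter(torch.ones(num_channels))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        alpha = self.alpha.view(1, -1, 1, 1)
        beta = self.beta.view(1, -1, 1, 1)
        # Clamp before exp: torch.where evaluates both branches, and
        # clamping avoids overflow of exp on large positive inputs.
        neg = alpha * (torch.exp(beta * torch.clamp(x, max=0.0)) - 1.0)
        return torch.where(x > 0, x, neg)
```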


Understanding and Improving Convolutional Neural Networks via Concatenated Rectified Linear Units

Recently, convolutional neural networks (CNNs) have been used as a powerful tool to solve many problems in machine learning and computer vision. In this paper, we aim to provide insight into the properties of convolutional neural networks, as well as a generic method to improve the performance of many CNN architectures. Specifically, we first examine existing CNN models and observe an intriguing pr...
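As the title suggests, the concatenated ReLU (CReLU) scheme keeps both the positive and the negative phase of a response rather than discarding the negative half. A minimal sketch, assuming the standard formulation of concatenating relu(x) with relu(-x) along the channel axis:

```python
import torch


def crelu(x: torch.Tensor, dim: int = 1) -> torch.Tensor:
    """Concatenated ReLU: relu(x) joined with relu(-x).

    The output has twice as many channels as the input, since both
    phases of each feature map are preserved.
    """
    return torch.cat((torch.relu(x), torch.relu(-x)), dim=dim)
```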


Improving deep convolutional neural networks with mixed maxout units

Motivated by insights from maxout-unit-based deep Convolutional Neural Networks (CNNs), namely that "non-maximal features are unable to deliver" and "feature mapping subspace pooling is insufficient," we present a mixed variant of the recently introduced maxout unit, called the mixout unit. Specifically, we do so by calculating the exponential probabilities of feature mappings gained by applying ...
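The abstract is cut off before the construction is fully stated, but one plausible reading of "exponential probabilities of feature mappings" is a softmax-weighted mixture over maxout's k linear pieces in place of a hard max, so that non-maximal features still contribute. The sketch below encodes that assumption and may differ from the paper's exact definition.

```python
import torch


def mixout(pieces: torch.Tensor) -> torch.Tensor:
    """Mixout sketch: softmax-weighted mixture of maxout's linear pieces.

    `pieces` stacks the k candidate feature mappings on dim 0, shape
    (k, N, C, H, W). Maxout would take pieces.max(dim=0); here every
    piece is weighted by its exponential (softmax) probability instead,
    which is an assumption about the paper's construction.
    """
    weights = torch.softmax(pieces, dim=0)
    return (weights * pieces).sum(dim=0)
```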


Continuously Differentiable Exponential Linear Units

Exponential Linear Units (ELUs) are a useful rectifier for constructing deep learning architectures, as they may speed up and otherwise improve learning by virtue of not having vanishing gradients and by having mean activations near zero [1]. However, the ELU activation as parametrized in [1] is not continuously differentiable with respect to its input when the shape parameter α is not equal to 1...
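The CELU form resolves this by scaling the exponent by α: CELU(x) = max(0, x) + min(0, α(exp(x/α) − 1)). The derivative of the negative branch is exp(x/α), which tends to 1 as x → 0⁻ and so matches the positive branch's slope for any α > 0, whereas plain ELU matches only at α = 1. A short self-contained sketch (PyTorch also ships this as torch.nn.CELU):

```python
import torch


def celu(x: torch.Tensor, alpha: float = 1.0) -> torch.Tensor:
    """CELU: max(0, x) + min(0, alpha * (exp(x / alpha) - 1)).

    Dividing the exponent by alpha makes the negative branch's slope
    approach 1 at x = 0, so the function is continuously
    differentiable for any alpha > 0.
    """
    neg = alpha * torch.expm1(x / alpha)  # expm1 is exp(x) - 1, stably
    return torch.clamp(x, min=0.0) + torch.minimum(torch.zeros_like(x), neg)
```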



Journal

Journal title: Neurocomputing

Year: 2020

ISSN: 0925-2312

DOI: 10.1016/j.neucom.2020.03.051